In this paper, we develop an approach to exploiting kernel methods with manifold-valued data. In many computer vision problems, the data can be naturally represented as points on a Riemannian manifold. Due to the non-Euclidean geometry of Riemannian manifolds, usual Euclidean computer vision and machine learning algorithms yield inferior results on such data. In this paper, we define Gaussian radial basis function (RBF)-based positive definite kernels on manifolds that permit us to embed a given manifold with a corresponding metric in a high-dimensional reproducing kernel Hilbert space. These kernels make it possible to utilize algorithms developed for linear spaces on nonlinear manifold-valued data. Since the Gaussian RBF defined with any given metric is not always positive definite, we present a unified framework for analyzing the positive definiteness of the Gaussian RBF on a generic metric space. We then use the proposed framework to identify positive definite kernels on two specific manifolds commonly encountered in computer vision: the Riemannian manifold of symmetric positive definite matrices and the Grassmann manifold, i.e., the Riemannian manifold of linear subspaces of a Euclidean space. We show that many popular algorithms designed for Euclidean spaces, such as support vector machines, discriminant analysis, and principal component analysis, can be generalized to Riemannian manifolds with the help of such positive definite Gaussian kernels.
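To make the idea concrete, the sketch below (an illustrative assumption, not the paper's code) builds a Gaussian RBF kernel on the manifold of symmetric positive definite (SPD) matrices using the log-Euclidean distance, one of the metrics for which the Gaussian RBF is known to be positive definite, and then checks that the resulting Gram matrix is positive semi-definite. The helper names (`logm_spd`, `log_euclidean_gaussian_kernel`, `random_spd`) are hypothetical, and `numpy` is assumed.

```python
import numpy as np

def logm_spd(S):
    # Matrix logarithm of an SPD matrix via its eigendecomposition.
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def log_euclidean_gaussian_kernel(S1, S2, gamma=1.0):
    # Gaussian RBF defined with the log-Euclidean distance
    # d(S1, S2) = ||log(S1) - log(S2)||_F, which maps SPD matrices to a
    # Euclidean space and therefore yields a positive definite kernel.
    d = np.linalg.norm(logm_spd(S1) - logm_spd(S2), "fro")
    return np.exp(-gamma * d ** 2)

def random_spd(n, rng):
    # A A^T plus a scaled identity is symmetric positive definite.
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

rng = np.random.default_rng(0)
mats = [random_spd(3, rng) for _ in range(5)]

# Gram matrix of the kernel over the sample of SPD matrices.
K = np.array([[log_euclidean_gaussian_kernel(a, b) for b in mats]
              for a in mats])

# Empirical positive definiteness check: eigenvalues of the Gram matrix
# should be nonnegative up to numerical error.
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min())
```

Such a kernel matrix can be passed directly to any kernel algorithm that accepts a precomputed Gram matrix (e.g., a kernel SVM), which is precisely how Euclidean algorithms are extended to manifold-valued data here.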